- Fix an issue in `JuliaCall` which breaks `autodiffr` and is caused by changes in `Julia` upstream.
- Wrap API functions of `ForwardDiff.jl` and `ReverseDiff.jl`. Since a `Julia` function can mutate the content of its arguments, which is typically not true for R functions, the mutating functions in `ForwardDiff.jl` and `ReverseDiff.jl` are temporarily not wrapped.
- Read the tests of `ForwardDiff.jl` and `ReverseDiff.jl` and prepare to adopt some of them for `autodiffr`. Related resources are https://github.com/JuliaDiff/ReverseDiff.jl/tree/master/test.
- Start adapting functions in `DiffTests.jl`.
- Continue adapting `DiffTests.jl`. Note that the mutating-related functions in `DiffTests.jl` are not adapted.
- Apply the change in the testing of automatic differentiation of the rosenbrock function.
- Update the wrappers of `ForwardDiff.jl` and `ReverseDiff.jl`.
- Improve `JuliaCall` to facilitate the development of `autodiffr`.
- Implement `forward_deriv` based on https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/test/DerivativeTest.jl. Found an issue caused by `JuliaCall`; the problematic test will be commented out until the issue is fixed in `JuliaCall`.
- Implement `forward_grad` based on https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/test/GradientTest.jl.
- Find R constructs that cause problems for automatic differentiation: `ifelse`; `sapply`, `mapply`, etc.; and `for (i in x)` when `x` is the input vector.
- After the fix in `JuliaCall`, add the commented test back.
- Investigate the `sapply` and `mapply` issue and try many methods to solve it. Since every main function involved in `sapply` and `mapply` is either non-generic or internal-generic, and there seems to be no way to overload any needed method, the only viable approach seems to be defining our own methods, which has advantages and disadvantages:
  - users have to use our own versions of `sapply` and `mapply` to make automatic differentiation work for their functions;
  - the usage is the same as `mapply`, since the replacement is modeled on `mapply`.
- Implement our own versions of `sapply` and `mapply` and add the related tests back.
- Work around `for (i in x)` by using `for (i in as.list(x))` instead in the function to which we want to apply automatic differentiation. Although the approach has some performance cost and is a little inconvenient, it should work.
- Resolve the `for (i in x)` issue and add the related test back.
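The loop rewrite described above can be sketched in plain R (the function names here are illustrative, not part of autodiffr):

```r
# R's `for (i in x)` iterates through internal code that does not
# dispatch on the class of `x`, so it breaks for wrapped AD vectors.
# Iterating over `as.list(x)` instead keeps each element intact,
# at some performance cost.
sumsq_loop <- function(x) {
  s <- 0
  for (i in x) s <- s + i^2        # problematic when `x` is a wrapped AD vector
  s
}
sumsq_ad_friendly <- function(x) {
  s <- 0
  for (i in as.list(x)) s <- s + i^2   # the workaround from the log
  s
}
sumsq_ad_friendly(c(1, 2, 3))  # 14, identical to sumsq_loop() on plain input
```

For plain numeric input both versions agree, so the rewrite costs nothing in correctness.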
- Now `ad_setup()` is not necessary any more, which is more convenient.
- Adapt the Hessian tests of `ForwardDiff.jl` at https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/test/HessianTest.jl.
- Study `ForwardDiff.jl` and `ReverseDiff.jl` and do some experiments about this.
- Add a `tag` argument to the Config methods to match the signatures in `ForwardDiff.jl` and `ReverseDiff.jl`, and wrap the `Config` objects.
- Originally planned to add `Tag` to the API, but after more thinking and investigation, it is unnecessary. The `Tag` type in `ForwardDiff.jl` is for checking perturbation confusion, and setting it manually doesn't have any benefit: if users want to disable tag checking, they can use the `check` argument or leave the tag in the config objects as `nothing`. Tests of `Tag` are useful for testing `ForwardDiff.jl` internally, which is not the case for `autodiffr`. More discussion can be seen at the GitHub issue https://github.com/Non-Contradiction/autodiffr/issues/11.
- In the `config` functions, add a `chunk_size` argument to replace the `chunk` argument, as the chunk size is the only purpose of `Chunk` in `ForwardDiff.jl`. More discussion can be seen at https://github.com/Non-Contradiction/autodiffr/issues/12.
- In the `grad`, `jacobian` and `hessian` functions, use `TRUE` or `FALSE` as `check` instead of `Val{true}()` or `Val{false}()`, which would require using `JuliaCall` directly. The change makes the wrapper functions more idiomatic for R.
- Find an issue with `JuliaObject` in `JuliaCall`; open the issue on `JuliaCall` at https://github.com/Non-Contradiction/JuliaCall/issues/53.
- Implement a `rep.JuliaObject` method in `JuliaCall`.
- Find problems with `rep.JuliaObject` in `JuliaCall`. The failing tests are temporarily commented out.
- Fix `rep.JuliaObject` in `JuliaCall`.
- Improve `JuliaObject` in `JuliaCall`.
- Implement `assign!` to match the behavior of assignment in R and use it for `JuliaObject` in `JuliaCall`.
- Use the `JuliaPlain` idea to alleviate the problem that R dispatches only on the first argument, making `ifelse` possible to work for `JuliaObject` in `JuliaCall`.
- Find a problem in `ForwardDiff.jl`; create an issue and pull request to fix it.
- Apply a workaround in `autodiffr` before the fix is merged into `ForwardDiff.jl`.
- Add the test of `ifelse` back, which becomes possible due to the changes above.
- Investigate the `ifelse` issue and other related ones; do experiments about this and think about the implementation in `JuliaCall`.
- Investigate the `diag` issue with `autodiffr` and `JuliaCall`.
- Get `diag`-related methods implemented in `JuliaCall`, i.e., `is.matrix.JuliaObject`, `is.array.JuliaObject` and `dim.JuliaObject`, with tests for the new functionality in `JuliaCall`.
- Implement `diagm` to supplement the `diag` methods from `JuliaCall`. It turns out to be very difficult to implement `diag` for `JuliaObject` vectors, and `diagm` is a clear name for this functionality.
- Adapt the Jacobian tests of `ForwardDiff.jl` at https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/test/JacobianTest.jl.
- Update `DiffTests.R`.
- Check `DiffResults` at https://github.com/JuliaDiff/DiffResults.jl, which can save the differentiation results and can get the differentiation results of multiple orders simultaneously.
- Refactor the `ReverseDiff` API to use the wrapper functions.
- Skip part of the `ReverseDiff` API: as I understand, it doesn't make a big difference, and it also doesn't appear in the documentation, examples and tests.
- Read the `AbstractTape` APIs of `ReverseDiff` at http://www.juliadiff.org/ReverseDiff.jl/api/#the-abstracttape-api.
- Wrap the `AbstractTape` APIs of `ReverseDiff`.
- Integrate the `AbstractTape`-related methods into the current APIs of `ReverseDiff`.
- Improve `ad_setup()` a little bit.
- Allow `NULL` for the `cfg` arguments in the APIs of `reverse_grad`, `reverse_jacobian` and `reverse_hessian`. This can improve the performance of the functions a little, since the `Config` object is not needed in the `AbstractTape`-related methods.
- Make `ReverseDiff` able to deal with functions with multiple arguments. Use the idea of `positionize` to turn idiomatic R functions with named arguments into functions with positional arguments, which `ReverseDiff` can easily deal with. Related to issue #15.
- Incorporate `positionize`, which deals with functions of multiple arguments, into all `ReverseDiff` APIs step by step, including the `AbstractTape`-related methods.
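The `positionize` idea can be sketched as follows (a minimal illustration; autodiffr's actual implementation may differ):

```r
# Turn a function with named arguments into one taking positional
# arguments only, which the ReverseDiff wrappers can handle easily.
positionize <- function(f, arg_names) {
  function(...) {
    args <- list(...)
    names(args) <- arg_names   # re-attach the original argument names
    do.call(f, args)
  }
}

f <- function(a, b) sum(a * b)
g <- positionize(f, c("a", "b"))
g(c(1, 2), c(3, 4))  # same as f(a = c(1, 2), b = c(3, 4)), i.e. 11
```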
- Adapt tests for the `ReverseDiff` gradient from https://github.com/JuliaDiff/ReverseDiff.jl/blob/master/test/api/GradientTests.jl.
- Find missing generics for `JuliaObject`, namely `%*%`, `determinant` and `as.vector`.
- Since `%*%` is only S4 generic but not S3 generic, it is difficult to overload this method for `JuliaObject`. Temporarily define a `%x%` method to deal with this problem, and incorporate the method into the tests.
- Find problems with the `matrix` function.
- Work on the `DiffResults`-related APIs.
- Use `%m%` instead of `%x%`, as `%x%` already means Kronecker product in R.
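A minimal sketch of such an operator (a plain S3 fallback for illustration; not autodiffr's actual definition):

```r
# `%m%`: an S3-dispatchable matrix-multiplication operator. Methods can
# later be added for JuliaObject arguments; `%x%` could not be reused
# because base R defines it as the Kronecker product.
`%m%` <- function(x, y) UseMethod("%m%")
`%m%.default` <- function(x, y) x %*% y

a <- matrix(1:4, 2, 2)
b <- diag(2)
identical(a %m% b, a %*% b)  # TRUE for base matrices
```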
- Implement `zeros` and `ones` to deal with the problem of creating certain types of `Julia` matrices.
- Implement `Jmatrix` to deal with the reshaping problem of `JuliaObject`.
- Implement `cSums`, `cMeans`, `rSums` and `rMeans` to deal with the problem of `colSums`, `colMeans`, `rowSums` and `rowMeans` for `JuliaObject`.
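Since `colSums` and friends are not S3 generic, helpers like `cSums` can be sketched as new generics with base-R defaults (hypothetical fallback definitions, for illustration):

```r
# New generics that default to the base functions, so that methods for
# JuliaObject arguments can be registered separately.
cSums <- function(x, ...) UseMethod("cSums")
cSums.default <- function(x, ...) colSums(x, ...)
rSums <- function(x, ...) UseMethod("rSums")
rSums.default <- function(x, ...) rowSums(x, ...)

m <- matrix(1:6, nrow = 2)
cSums(m)  # 3 7 11
rSums(m)  # 9 12
```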
- Improve `JuliaObject` in `JuliaCall`, which is related to `autodiffr`.
- Implement `determinant`, `mean` and `solve` generics for `JuliaObject` in `JuliaCall`.
- Fix `c.JuliaObject` because of a new problem identified in `det`.
- Deal with `det` for `JuliaObject`. Omit the tests of tape-related `ReverseDiff` methods with `det`.
- Make the tests of `ReverseDiff` pass.
- More tests of `ReverseDiff` can also pass.
- Fix `t.JuliaObject` in `JuliaCall`.
- Update `DiffTests.R`.
- Find a problem with `c.JuliaObject` and fix it.
- Omit the `solve` one in `ReverseDiff` Jacobian testing currently.
- Record the problems in an `autodiffr` issue: https://github.com/Non-Contradiction/autodiffr/issues/18.
- `array` can be used instead of `matrix` and is more friendly to generics.
- Implement more generics for `JuliaObject` in `JuliaCall`: `as.vector`, `dim<-`, and `aperm`.
- Use `array` instead of `Jmatrix`.
- The `grad`, `jacobian` and `hessian` functions are in place, but `deriv` is not, and the function signature is `grad(func, x = NULL, mode = c("forward", "reverse"), xsize = x, chunk_size = NULL, use_tape = FALSE, compiled = FALSE, ...)`. May need to make `grad` also work for the scalar case and be an alias of `deriv`?
- Make the `autodiffr` interface functions more robust to things like length-1 vectors and 1x1 matrices. The interface functions can now treat scalars as length-1 vectors and 1x1 matrices as scalars if necessary.
- Change `deriv` to be the same as `grad`.
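The normalization rule for interface functions described above can be sketched with small helpers (hypothetical names, for illustration only):

```r
# Treat a 1x1 matrix as a scalar, and a bare scalar as a length-1
# vector, mirroring how the interface functions normalize arguments.
drop_to_scalar <- function(x) {
  if (is.matrix(x) && all(dim(x) == c(1, 1))) x[1, 1] else x
}
up_to_vector <- function(x) {
  if (!is.matrix(x) && length(x) == 1) as.vector(x) else x
}

drop_to_scalar(matrix(5, 1, 1))  # 5
up_to_vector(3)                  # a length-1 vector
```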
- Update `JuliaCall`, which deals with a potential setup problem with `autodiffr` on Linux.
- Ask in `JuliaCall` about using `Rcpp` with `JuliaCall`.
- Experiment in `autodiffr` with `Rcpp` to get access to `JuliaCall`.
- Use the `Rcpp` interface of `JuliaCall`.
- Add an `Rcpp` example in `autodiffr`.
- Fix `JuliaCall` for `as.vector.JuliaObject`.
- Update `JuliaCall` on CRAN.
- Have the `ad_variant` function exported in `JuliaCall`, which is an automatic adaptation tool for normal R functions to be more suitable for AD.
- Deal with `as.numeric` (equivalent to `as.double`).
- Implement `as.double.JuliaObject` in `JuliaCall`.
- Find an issue in `ReverseDiff.jl` caused by `Base.float`; see https://github.com/JuliaDiff/ReverseDiff.jl/issues/107.
- Apply a workaround in `autodiffr` for problems with `as.double`.
- Update `JuliaCall` to help users set up `autodiffr`.
- New release of `JuliaCall` on CRAN.
- Update `autodiffr` to depend on the new release of `JuliaCall`.
- Work on `DiffResults` with `ReverseDiff` and `ForwardDiff`.
- Integrate `DiffResults` into the wrapper functions of `ReverseDiff` and `ForwardDiff`.
- Implement the `DiffResults`-related APIs.
- Test the `DiffResults`-related APIs.
- Add a `y` argument in `JacobianResult` to deal with the case that the output doesn't have the same shape as `x`.
- Improve the `DiffResults`-related APIs.
- Work on the performance of `JuliaCall`; have several plans to improve it further.
- Abandon one plan for `JuliaCall` because it comes at the expense of flexibility.
- Separate the user functions of `autodiffr` into two groups, XX and makeXXFunc. The intention of using the functions will be more clear, and the documentation of the functions and arguments is also more clear.
- Incorporate the new `JuliaCall` into `autodiffr` first. Have a `debug` argument which can be turned off to allow computations to be more performant.
- Improve the `JuliaCall` interface.
- Update the `DiffResults`-related wrapper functions.
- Update `autodiffr` to be compatible with the `JuliaCall` in development.
- Try the new `JuliaCall` in `autodiffr` to see the performance improvement. The improvement is obvious when the dominant part is overhead, as in the case of unoptimized gradient functions (where the overhead may occur multiple times), or in the case of very fast calculations. But in some other cases it is not that obvious.
- Improve the `JuliaCall` interface.
- Improve `JuliaCall` for `rcopy` and `sexp` of `JuliaObject`, which is another important source of overhead in `autodiffr`.
- Try the new `JuliaCall` in `autodiffr` to see the performance improvement.
- Support `a[i,]`-style indexing for `JuliaObject` in `JuliaCall`.
- Implement `max` and similar generics for `JuliaObject` in `JuliaCall`.
- Improve `ad_variant` in `autodiffr`.
- Update `JuliaCall` and `autodiffr`.
- Consider one more optimization in `JuliaCall`, but it would be quite limited. So the performance improvement work in `JuliaCall` in this phase is finished quite satisfactorily. After more bug fixing, `JuliaCall` should have another release, and `autodiffr` needs to depend on the new version of `JuliaCall`.
- Avoid `isFALSE` in `autodiffr`, as it doesn't exist in R versions earlier than 3.5.
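A backport with the same semantics as base R's `isFALSE` (added in R 3.5.0) is straightforward:

```r
# Equivalent of base::isFALSE for R < 3.5: TRUE only for a single
# non-missing logical FALSE value.
is_false_compat <- function(x) {
  is.logical(x) && length(x) == 1L && !is.na(x) && !x
}

is_false_compat(FALSE)            # TRUE
is_false_compat(c(FALSE, FALSE))  # FALSE: not length 1
is_false_compat(0)                # FALSE: not logical
```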
- Update the documentation of `autodiffr` about the latest usage of the interface functions.
- Improve `JuliaObject` in `JuliaCall` to facilitate the functionality of `autodiffr`.
- Implement `is.numeric` and `is.double` generics for `JuliaObject` to deal with issues with automatic differentiation for functions containing `is.numeric`.
- Add tests of `is.numeric` in `autodiffr`.
- Fix `c.JuliaObject`.
- Add tests of `c` in `autodiffr`.
- Remove the `R6` dependency of `JuliaCall` to reduce overhead caused by `R6` with `JuliaObject`.
- Test `autodiffr` with the latest `JuliaCall`.
- Update `JuliaCall` for `Julia` 0.7.
- Now the main functionality of `JuliaCall` should be usable with `Julia` 0.7.
- Update `JuliaCall` as used by `autodiffr` to be compatible with `Julia` 0.7.
- `ForwardDiff.jl` works on `Julia` 0.7 while `ReverseDiff.jl` does not. Have `reverse` and `forward` arguments in `ad_setup()` to be able to load `ForwardDiff.jl` and `ReverseDiff.jl` separately. `ad_setup(forward = TRUE)` should be okay to use on `Julia` 0.7.
- Test `JuliaCall` on `Julia` v0.7.
- Implement `diff.JuliaObject` in `JuliaCall` so `autodiffr` can deal with `diff`.
- Test `JuliaCall` with `Julia` v0.7 on Linux, Mac and Windows.
- Fix `JuliaCall` for `Julia` v0.7.
- Release `JuliaCall` on CRAN, and have `autodiffr` depend on it.
- Rename `ad.variant` to `ad_variant`.
- Update the wrappers of `ForwardDiff.jl` and `ReverseDiff.jl`.
- Rename `grad` to `ad_grad`, `jacobian` to `ad_jacobian`, etc., and change all the documentation accordingly.
- Improve `ad_variant` for constructs like `apply` and assignment.
- Improve `ad_variant`; now it is able to use `checkArgs` to do some checking.
- Add a `julia_array` helper function to create arrays in `Julia`.